# Literary content creation
| Model | Author | License | Downloads | Likes | Description |
|-------|--------|---------|-----------|-------|-------------|
| Gpt2 Spanish Medium | DeepESP | MIT | 221 | 9 | A language generation model trained from scratch on 11.5GB of Spanish text, with a Byte Pair Encoding (BPE) tokenizer trained specifically for this purpose. |
| Gpt2 Spanish | DeepESP | MIT | 2,917 | 37 | A language generation model trained on 11.5GB of Spanish text, using the same parameter configuration as the OpenAI GPT-2 small version. |

Both models are tagged "Large Language Model" and "Supports Multiple Languages".
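Since both models follow the standard GPT-2 architecture, they can presumably be loaded with the Hugging Face `transformers` library. Below is a minimal sketch; the repository IDs `DeepESP/gpt2-spanish` and `DeepESP/gpt2-spanish-medium` are assumed from the listing above.

```python
# A minimal sketch of generating Spanish text with these models using the
# Hugging Face transformers pipeline API. The repository IDs are assumed
# from the model names in the listing above.
from transformers import pipeline

# Load the small variant; swap in "DeepESP/gpt2-spanish-medium" to try
# the medium model instead.
generator = pipeline("text-generation", model="DeepESP/gpt2-spanish")

# Sample a short continuation from a Spanish prompt.
result = generator(
    "Había una vez",    # "Once upon a time"
    max_new_tokens=50,  # length of the generated continuation
    do_sample=True,     # sample rather than decode greedily
    top_k=50,           # restrict sampling to the 50 most likely tokens
)
print(result[0]["generated_text"])
```

Sampling parameters such as `top_k` are a matter of taste; for literary generation, sampling-based decoding usually produces more varied text than greedy decoding.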